282 research outputs found

    Unsupervised Discovery of Phonological Categories through Supervised Learning of Morphological Rules

    We describe a case study in the application of symbolic machine learning techniques for the discovery of linguistic rules and categories. A supervised rule induction algorithm is used to learn to predict the correct diminutive suffix given the phonological representation of Dutch nouns. The system produces rules which are comparable to rules proposed by linguists. Furthermore, in the process of learning this morphological task, the phonemes used are grouped into phonologically relevant categories. We discuss the relevance of our method for linguistics and language technology.
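    The grouping effect this abstract describes can be illustrated with a toy sketch: a trivial learner that maps each noun's final letter to its majority diminutive suffix, then reads "phoneme categories" off the learned rules. The example words are real Dutch diminutives, but the learner is a drastic simplification for illustration, not the paper's actual rule-induction algorithm.

```python
from collections import Counter, defaultdict

def induce_rules(examples):
    # One rule per final symbol: predict the majority diminutive suffix.
    by_final = defaultdict(Counter)
    for word, suffix in examples:
        by_final[word[-1]][suffix] += 1
    return {ph: c.most_common(1)[0][0] for ph, c in by_final.items()}

# Tiny sample of real Dutch diminutives, chosen by us for illustration:
examples = [("boom", "pje"), ("raam", "pje"), ("bal", "etje"),
            ("ring", "etje"), ("kat", "je"), ("hond", "je")]
rules = induce_rules(examples)
# rules == {'m': 'pje', 'l': 'etje', 'g': 'etje', 't': 'je', 'd': 'je'}

# Finals that trigger the same suffix fall into one induced category,
# mirroring the paper's point that learning the task groups phonemes:
categories = defaultdict(set)
for ph, suf in rules.items():
    categories[suf].add(ph)
```

    In this toy run, {t, d} and {l, g(ng)} emerge as groups because they select the same suffix, which is the category-discovery side effect the abstract highlights.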

    Growth Management Policies for Exurban and Suburban Development: Theory and an Application to Sonoma County, California

    This study examines the effectiveness of growth management policies in influencing future patterns of exurban and suburban development. We initially estimate a spatially explicit model of residential development with parcel data in Sonoma County, California. This estimated model is then used to simulate the effect of urban growth boundaries (UGBs) versus allowing municipal sewer service expansion. The UGB policy decreases the amount of suburban development but is less effective in managing exurban development. The downzoning policy in agricultural and resource areas reduces the amount of exurban development, but only partially, owing to the prevalence of grandfathered lots in rural areas.
    Keywords: exurban development, urban growth boundaries, sprawl, spatial modeling, urban fringe, Land Economics/Use

    An Analysis of the World’s Environment and Population Dynamics with Varying Carrying Capacity, Concerns and Skepticism

    Due to the open-access nature of the environment, we consider an ad hoc adjustment of people's footprints to the quality of the environment. The adjustment is due to concerns, but hindered by skepticism about announced changes in the state of the environment. Changes in the quality of the environment affect Earth's carrying capacity. By expanding the Lotka-Volterra predator-prey model to include these features we show that despite skepticism the environment-population system does not collapse. We also show that in the ideal case of no skepticism, the interplay between the non-optimally changing environmental concerns and carrying capacity sends the world's environment and human population on an oscillating course that leads to a unique interior steady state. These results require no further technological, social or international progress.
    Keywords: Environment; Population; Carrying Capacity; Concerns; Skepticism
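    A minimal numerical sketch of coupled dynamics in this spirit (all functional forms and parameter values are our own illustrative choices, not the paper's): population grows logistically toward a carrying capacity that scales with environmental quality, the aggregate footprint tracks perceived quality, and skepticism makes perception lag the true state.

```python
def simulate(steps=100_000, dt=0.01):
    # State: population N, environmental quality E, perceived quality P.
    # Parameters below are illustrative, chosen only so the sketch runs.
    N, E, P = 1.0, 1.0, 1.0
    r, delta, s = 0.5, 0.02, 0.1      # growth rate, footprint scale, belief-updating speed
    for _ in range(steps):
        K = 10.0 * E                  # carrying capacity scales with quality
        f = delta * P                 # per-capita footprint tracks *perceived* quality
        dN = r * N * (1.0 - N / max(K, 1e-6))
        dE = 0.3 * E * (1.0 - E) - f * N   # regeneration minus aggregate footprint
        dP = s * (E - P)              # skepticism: beliefs adjust only slowly toward E
        N = max(N + dN * dt, 1e-6)    # simple Euler steps, clamped positive
        E = max(E + dE * dt, 1e-6)
        P = max(P + dP * dt, 1e-6)
    return N, E, P
```

    With these particular parameters the trajectory oscillates toward an interior state with both population and environment positive; whether a given parameterization collapses or settles is exactly the kind of question the paper analyzes formally.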

    Spatial Targeting Strategies for Land Conservation

    Purchasing development rights is a major mechanism for the protection of environmental quality and landscape amenities. This paper provides a targeting strategy for protecting multiple environmental benefits that takes into account land costs and probability of land use conversion. We compare two strategies. Subject to a budget constraint on parcel purchases, the standard strategy is to target parcels with the highest ratio of environmental benefits to land costs. The standard strategy selects parcels even if there is little probability that the parcel would otherwise be converted. Our new strategy targets parcels to minimize the benefit loss from land conversion, which weights parcels based on initial benefit endowment and expected probability of land use conversion. The empirical analysis focuses on targeting conservation easements in the exurban region of Sonoma County, CA, in which extensively managed, developable parcels (i.e., pasture and forest areas) with environmental benefits are being converted to residential use and vineyards. Spatially explicit modeling approaches are employed to estimate land values and likelihood of land use conversion, according to heterogeneous parcel site characteristics, for all developable parcels. Our results indicate that benefit-cost targeting is biased toward low-cost parcels, since it ignores the variation in likelihood of future land use conversion. This inefficiency of benefit-cost targeting arises from the positive relationship that typically exists between likelihood of land use change and value of development rights. Hence, some parcels with poor land quality or remote accessibility to urban centers would have de facto conservation, and therefore do not warrant targeting of conservation funds, despite the low cost of protection. Our new targeting strategy balances the countervailing factors of land values and likelihood of land use conversion.
    Keywords: Land Economics/Use
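    The contrast between the two strategies can be sketched as a greedy budget-constrained selection. The parcel data and field names below are invented for illustration; the paper's implementation relies on spatially explicit estimates of values and conversion probabilities.

```python
def select(parcels, budget, strategy="benefit_cost"):
    """Greedily pick parcels under a budget.

    'benefit_cost'    ranks by benefit / cost (the standard strategy);
    'loss_minimizing' ranks by benefit * p_convert / cost, weighting each
    parcel by its conversion threat (our reading of the proposed strategy).
    """
    if strategy == "benefit_cost":
        key = lambda p: p["benefit"] / p["cost"]
    else:
        key = lambda p: p["benefit"] * p["p_convert"] / p["cost"]
    chosen, spent = [], 0.0
    for p in sorted(parcels, key=key, reverse=True):
        if spent + p["cost"] <= budget:
            chosen.append(p)
            spent += p["cost"]
    return chosen

# Hypothetical parcels: a cheap remote one facing little conversion threat,
# and a costlier urban-fringe one almost certain to be developed.
parcels = [
    {"name": "remote", "benefit": 10.0, "cost": 1.0, "p_convert": 0.05},
    {"name": "fringe", "benefit": 12.0, "cost": 4.0, "p_convert": 0.9},
]
bc = select(parcels, budget=4.0, strategy="benefit_cost")      # picks "remote"
lm = select(parcels, budget=4.0, strategy="loss_minimizing")   # picks "fringe"
```

    The toy run reproduces the bias the abstract describes: benefit-cost targeting buys the cheap remote parcel that already enjoys de facto conservation, while threat-weighted targeting spends the budget where conversion is actually likely.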

    Habitat and open space at risk and the prioritization of conservation easements

    Funds available to purchase land and easements for conservation purposes are limited. This article provides a targeting strategy for protecting multiple environmental benefits that includes heterogeneity in land costs and probability of land-use conversion, by incorporating spatially explicit land-use change and hedonic price models. This strategy is compared to two alternative strategies that omit either land cost or conversion threat. Based on dynamic programming and Monte Carlo simulations with alternating periods of conservation and development, we demonstrate that the positive correlation between land costs and probability of land-use conversion affects targeting efficiency, using parcel data from Sonoma County, California.
    Keywords: Environmental Economics and Policy

    MBT: A Memory-Based Part of Speech Tagger-Generator

    We introduce a memory-based approach to part-of-speech tagging. Memory-based learning is a form of supervised learning based on similarity-based reasoning. The part-of-speech tag of a word in a particular context is extrapolated from the most similar cases held in memory. Supervised learning approaches are useful when a tagged corpus is available as an example of the desired output of the tagger. Based on such a corpus, the tagger-generator automatically builds a tagger which is able to tag new text the same way, considerably reducing the development time for the construction of a tagger. Memory-based tagging shares this advantage with other statistical or machine learning approaches. Additional advantages specific to a memory-based approach include (i) the relatively small tagged corpus size sufficient for training, (ii) incremental learning, (iii) explanation capabilities, (iv) flexible integration of information in case representations, (v) its non-parametric nature, (vi) reasonably good results on unknown words without morphological analysis, and (vii) fast learning and tagging. In this paper we show that a large-scale application of the memory-based approach is feasible: we obtain a tagging accuracy that is on a par with that of known statistical approaches, and with attractive space and time complexity properties when using IGTree, a tree-based formalism for indexing and searching huge case bases. The use of IGTree has the additional advantage that the optimal context size for disambiguation is computed dynamically.
    Comment: 14 pages, 2 PostScript figures
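    A stripped-down sketch of the idea (not MBT itself): store each training token's context as a case in memory, then tag new text greedily by majority vote among the most similar stored cases. The focus word is weighted more heavily, a crude stand-in for the information-gain feature weighting used in the memory-based literature.

```python
from collections import Counter

def train(tagged_sents):
    """Memory: one case per token, keyed by (previous tag, word, next word)."""
    cases = []
    for sent in tagged_sents:
        words = [w for w, _ in sent]
        for i, (w, t) in enumerate(sent):
            prev_tag = sent[i - 1][1] if i > 0 else "<s>"
            nxt = words[i + 1] if i + 1 < len(words) else "</s>"
            cases.append(((prev_tag, w, nxt), t))
    return cases

def tag(words, cases):
    """Greedy left-to-right tagging by nearest stored cases (overlap similarity)."""
    tags = []
    for i, w in enumerate(words):
        prev_tag = tags[-1] if tags else "<s>"
        nxt = words[i + 1] if i + 1 < len(words) else "</s>"
        def sim(feats):
            # Matching focus word counts 3x; context features count 1x each.
            return 3 * (feats[1] == w) + (feats[0] == prev_tag) + (feats[2] == nxt)
        best = max(sim(f) for f, _ in cases)
        votes = Counter(t for f, t in cases if sim(f) == best)
        tags.append(votes.most_common(1)[0][0])
    return tags
```

    Even this toy version generalizes across contexts: trained on "the dog runs" and "the cat sleeps", it tags the unseen sentence "the dog sleeps" correctly, because the nearest cases for each position carry the right tags.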

    Beyond Tax Smoothing

    Analyses of optimal government capital structure generally follow Bohn (1990) and Barro (1995) in assuming risk neutrality or an exogenous risk premium. These analyses usually conclude that the optimal government capital structure stabilizes tax rates over time and states of nature to the greatest extent possible, something known as "tax smoothing." In this paper, we show that when an endogenous risk premium is introduced, the optimal government capital structure will no longer smooth tax rates. Under likely conditions, the optimal structure requires a larger short position in risky assets than that implied by tax smoothing.

    The Case of Markets versus Standards for Pollution Policy


    Natural resources in a competitive economy.

    Thesis (Ph.D.)--Massachusetts Institute of Technology, Dept. of Economics, 1976. Microfiche copy available in Archives and Dewey. Includes bibliographical references.

    Incorporating anthropogenic influences into fire probability models : effects of human activity and climate change on fire activity in California

    The costly interactions between humans and wildfires throughout California demonstrate the need to understand the relationships between them, especially in the face of a changing climate and expanding human communities. Although a number of statistical and process-based wildfire models exist for California, there is enormous uncertainty about the location and number of future fires, with previously published estimates of increases ranging from nine to fifty-three percent by the end of the century. Our goal is to assess the role of climate and anthropogenic influences on the state's fire regimes from 1975 to 2050. We develop an empirical model that integrates estimates of biophysical indicators relevant to plant communities and anthropogenic influences at each forecast time step. Historically, we find that anthropogenic influences account for up to fifty percent of explanatory power in the model. We also find that total burned area is likely to increase, by 2.2 and 5.0 percent by 2050 under climatic bookends (the PCM and GFDL climate models, respectively). Our two climate models show considerable agreement, but due to potential shifts in rainfall patterns, substantial uncertainty remains for the semiarid inland deserts and coastal areas of the south. Given the strength of human-related variables in some regions, however, it is clear that comprehensive projections of future fire activity should include both anthropogenic and biophysical influences. Previous findings of substantially increased numbers of fires and burned area for California may be tied to omitted-variable bias from the exclusion of human influences. The omission of anthropogenic variables in our model would overstate the importance of climatic ones by at least 24%. As such, the failure to include anthropogenic effects in many models likely overstates the response of wildfire to climatic change.
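    The omitted-variable effect the abstract invokes can be demonstrated on synthetic data (everything below is invented for illustration, not the paper's model or coefficients): when a human-activity covariate that is correlated with climate is dropped from a logistic fire-occurrence model, the climate coefficient absorbs part of its effect and is inflated.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, steps=3000):
    """Plain batch gradient descent for logistic regression (column 0 = intercept)."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w -= lr * (X.T @ (p - y)) / len(y)
    return w

rng = np.random.default_rng(0)
n = 5000
climate = rng.normal(size=n)                            # e.g. an aridity index
human = 0.8 * climate + rng.normal(scale=0.6, size=n)   # human activity, correlated with climate
logit = -0.5 + 1.0 * climate + 1.0 * human              # true model: both drivers matter
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

ones = np.ones((n, 1))
w_full = fit_logistic(np.column_stack([ones, climate, human]), y)
w_omit = fit_logistic(np.column_stack([ones, climate]), y)
# Dropping the correlated human covariate inflates the apparent climate
# coefficient: w_omit[1] substantially exceeds w_full[1] (~1.0 by construction).
```

    This is the mechanism behind the 24% figure in the abstract: a model without anthropogenic variables attributes their correlated effect to climate, overstating the projected climate response.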